Diminishing Returns Shape Constraints for Interpretability and Regularization

Maya Gupta, Dara Bahri, Andrew Cotter, Kevin Canini

Neural Information Processing Systems

A model that predicts the price of an apartment should increase with the apartment's size, but each added square foot should matter less. Similarly, a model that predicts the time it will take a customer to grocery shop should decrease in the number of cashiers, but each added cashier reduces average wait time by less. In both cases, we would like to be able to incorporate this prior knowledge by constraining the machine-learned model's output to have a diminishing returns response to the size of the apartment or the number of cashiers.
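A diminishing-returns response is a function that is nondecreasing with nonincreasing slopes. The sketch below (hypothetical helper, not the paper's implementation) fits such a piecewise-linear function by reparametrizing each segment's slope as a tail sum of nonnegative terms, so monotonicity and concavity hold by construction:

```python
import numpy as np

def fit_diminishing_returns(x, y, n_knots=8, lr=0.01, steps=2000):
    """Fit a piecewise-linear f(x) that is nondecreasing with
    nonincreasing segment slopes (diminishing returns).
    Slope on segment k is s_k = g_k + g_{k+1} + ... + g_K with
    g_j = softplus(theta_j) >= 0, so s_1 >= s_2 >= ... >= 0 always."""
    knots = np.linspace(x.min(), x.max(), n_knots + 1)
    seg = np.diff(knots)
    # Hinge features: how far x has traveled into each segment.
    H = np.clip(x[:, None] - knots[None, :-1], 0.0, seg[None, :])
    theta = np.zeros(n_knots)   # raw parameters for slope increments
    b = float(y.mean())         # intercept
    for _ in range(steps):
        g = np.log1p(np.exp(theta))           # softplus -> g_j >= 0
        slopes = np.cumsum(g[::-1])[::-1]     # tail sums: nonincreasing
        r = b + H @ slopes - y                # residuals
        grad_slopes = H.T @ r / len(x)
        grad_g = np.cumsum(grad_slopes)       # d s_k / d g_j = [j >= k]
        theta -= lr * grad_g / (1.0 + np.exp(-theta))  # sigmoid = softplus'
        b -= lr * r.mean()
    g = np.log1p(np.exp(theta))
    return knots, np.cumsum(g[::-1])[::-1], b
```

Because the constraint is built into the parameterization, no projection step is needed; any gradient update preserves the shape constraint exactly.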





Deep Random Splines for Point Process Intensity Estimation of Neural Population Data

Neural Information Processing Systems

Gaussian processes are the leading class of distributions on random functions, but they suffer from well known issues including difficulty scaling and inflexibility with respect to certain shape constraints (such as nonnegativity). Here we propose Deep Random Splines, a flexible class of random functions obtained by transforming Gaussian noise through a deep neural network whose output are the parameters of a spline. Unlike Gaussian processes, Deep Random Splines allow us to readily enforce shape constraints while inheriting the richness and tractability of deep generative models. We also present an observational model for point process data which uses Deep Random Splines to model the intensity function of each point process and apply it to neural population data to obtain a low-dimensional representation of spiking activity. Inference is performed via a variational autoencoder that uses a novel recurrent encoder architecture that can handle multiple point processes as input. We use a newly collected dataset where a primate completes a pedaling task, and observe better dimensionality reduction with our model than with competing alternatives.
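The core construction, Gaussian noise pushed through a network whose outputs parameterize a shape-constrained spline, can be sketched in a few lines. This toy version (names and architecture are illustrative, not the paper's; it uses a piecewise-linear function in place of the paper's splines and untrained random weights in place of a learned generator) shows how a final softplus makes the entire random function nonnegative, a constraint a Gaussian process cannot enforce exactly:

```python
import numpy as np

def sample_random_intensity(n_knots=10, hidden=32, latent_dim=4, rng=None):
    """Toy analogue of a Deep Random Spline: latent Gaussian noise is
    mapped by a small MLP to knot values of a piecewise-linear intensity.
    A softplus output makes every knot value nonnegative, hence the whole
    (piecewise-linear) function is nonnegative by construction."""
    if rng is None:
        rng = np.random.default_rng()
    z = rng.normal(size=latent_dim)                      # latent noise
    W1 = rng.normal(scale=0.5, size=(hidden, latent_dim))
    W2 = rng.normal(scale=0.5, size=(n_knots, hidden))
    h = np.tanh(W1 @ z)
    knot_vals = np.log1p(np.exp(W2 @ h))                 # softplus >= 0
    knots = np.linspace(0.0, 1.0, n_knots)
    def intensity(t):
        # Linear interpolation of nonnegative knot values stays nonnegative.
        return np.interp(t, knots, knot_vals)
    return intensity
```

In the paper the generator weights are learned inside a variational autoencoder and the splines carry stronger constraints; the point here is only that shape constraints live in the parameterization, not in a post-hoc projection.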



Kernel-Based Nonparametric Tests For Shape Constraints

Rohan Sen

arXiv.org Machine Learning

We propose a kernel-based nonparametric framework for mean-variance optimization that enables inference on economically motivated shape constraints in finance, including positivity, monotonicity, and convexity. Many central hypotheses in financial econometrics are naturally expressed as shape relations on latent functions (e.g., term premia, CAPM relations, and the pricing kernel), yet enforcing such constraints during estimation can mask economically meaningful violations; our approach therefore separates learning from validation by first estimating an unconstrained solution and then testing shape properties. We establish statistical properties of the regularized sample estimator and derive rigorous guarantees, including asymptotic consistency, a functional central limit theorem, and a finite-sample deviation bound achieving the Monte Carlo rate up to a regularization term. Building on these results, we construct a joint Wald-type statistic to test shape constraints on finite grids. An efficient algorithm based on a pivoted Cholesky factorization yields scalability to large datasets. Numerical studies, including an options-based asset-pricing application, illustrate the usefulness of the proposed method for evaluating monotonicity and convexity restrictions.
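The estimate-then-test idea can be illustrated with an unconstrained kernel ridge fit followed by a grid check. This is a simplified descriptive sketch, not the paper's Wald-type statistic, and the function names are hypothetical; it shows the separation of learning (fit without constraints) from validation (inspect shape violations on a finite grid):

```python
import numpy as np

def krr_fit(x, y, lam=1e-2, gamma=10.0):
    """Unconstrained kernel ridge regression with an RBF kernel."""
    K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    def f(t):
        Kt = np.exp(-gamma * (t[:, None] - x[None, :]) ** 2)
        return Kt @ alpha
    return f

def monotonicity_violations(f, grid):
    """Evaluate f on a finite grid and report the largest negative
    increment. A zero value is consistent with monotonicity on the grid;
    the paper turns such grid evaluations into a joint Wald-type test."""
    diffs = np.diff(f(grid))
    return float(np.clip(-diffs, 0.0, None).max())
```

Because the fit is unconstrained, a large violation is informative: it reflects the data rather than an artifact of a constraint imposed during estimation.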


Supplement

Neural Information Processing Systems

We provide the proof (Section A) of our main result presented in Section 3. Section B gives an additional numerical illustration, in the context of kernel ridge regression, of the importance of hard shape constraints as the level of noise increases. Our results are summarized in Figure 1(b).